Minimizing Convex Functions by Continuous Descent Methods
Authors
Abstract
We study continuous descent methods, associated with an appropriate complete metric space of vector fields, for minimizing convex functions defined on general Banach spaces. We show that this space of vector fields contains an everywhere dense open set each of whose elements generates strongly convergent trajectories.
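For orientation only (this is not part of the abstract), one common way to make "continuous descent method" precise is sketched below; the descent condition imposed on the vector field V is an assumption standing in for the exact class of vector fields studied in the paper.
\[
  x'(t) = V\bigl(x(t)\bigr), \qquad t \ge 0, \qquad x(0) = x_0 \in X,
\]
where \(f \colon X \to \mathbb{R}\) is a continuous convex function on the Banach space \(X\) and the vector field \(V \colon X \to X\) satisfies
\[
  f^{0}\bigl(x, V(x)\bigr) \;=\; \lim_{s \to 0^{+}} \frac{f\bigl(x + sV(x)\bigr) - f(x)}{s} \;\le\; 0
  \qquad \text{for all } x \in X,
\]
so that \(t \mapsto f(x(t))\) is nonincreasing along every trajectory. A trajectory converges strongly when \(x(t)\) converges in the norm of \(X\) as \(t \to \infty\).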
Similar resources
A Generic Convergence Theorem for Continuous Descent Methods in Banach Spaces
We study continuous descent methods for minimizing convex functions defined on general Banach spaces and prove that most of them (in the sense of Baire category) converge.
MATHEMATICAL ENGINEERING TECHNICAL REPORTS Discrete L-/M-Convex Function Minimization Based on Continuous Relaxation
We consider the problem of minimizing a nonlinear discrete function with L-/M-convexity proposed in the theory of discrete convex analysis. For this problem, steepest descent algorithms and steepest descent scaling algorithms are known. In this paper, we use a continuous relaxation approach which minimizes the continuous-variable version first in order to find a good initial solution of a steepes...
Lecture Notes for IAP 2005 Course Introduction to Bundle Methods
Minimizing a convex function over a convex region is probably the core problem in the Nonlinear Programming literature. Under the assumption that the function of interest is differentiable, several methods have been proposed and successively studied. For example, the Steepest Descent Method consists of performing a line search along a descent direction given by minus the gradient of th...
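As an illustration of the steepest descent scheme described in this abstract, here is a minimal Python sketch; the quadratic test function, the backtracking parameters, and all names below are my own example and are not taken from the lecture notes.

import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimize a differentiable convex f by stepping along -grad(f) with a backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is (near) zero
            break
        d = -g                        # descent direction: minus the gradient
        t = 1.0
        # Backtracking (Armijo) line search along d
        while f(x + t * d) > f(x) - 0.5 * t * np.dot(g, g):
            t *= 0.5
        x = x + t * d
    return x

# Hypothetical test problem: a convex quadratic f(x) = x^T A x / 2 - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(steepest_descent(f, grad, np.zeros(2)))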
Random Coordinate Descent Methods for Minimizing Decomposable Submodular Functions
Submodular function minimization is a fundamental optimization problem that arises in several applications in machine learning and computer vision. The problem is known to be solvable in polynomial time, but general-purpose algorithms have high running times and are unsuitable for large-scale problems. Recent work has used convex optimization techniques to obtain very practical algorithms for ...
Linear convergence of epsilon-subgradient descent methods for a class of convex functions
This paper establishes a linear convergence rate for a class of epsilon-subgradient descent methods for minimizing certain convex functions on R^n. Currently prominent methods belonging to this class include the resolvent (proximal point) method and the bundle method in proximal form (considered as a sequence of serious steps). Other methods, such as the recently proposed descent proximal level m...
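To illustrate the proximal point (resolvent) method mentioned in this abstract, here is a minimal Python sketch; the choice of f(x) = ||x||_1, the step size lam, and the helper names are hypothetical and are not taken from the paper.

import numpy as np

def proximal_point(prox, x0, lam=1.0, n_iter=100):
    """Proximal point iteration: x_{k+1} = prox_{lam*f}(x_k),
    i.e. each step solves argmin_y f(y) + ||y - x_k||^2 / (2*lam) via the supplied prox operator."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = prox(x, lam)
    return x

# Example: f(x) = ||x||_1, whose proximal operator is componentwise soft-thresholding.
def soft_threshold(x, lam):
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# The iterates shrink toward the minimizer x = 0 of the l1 norm.
print(proximal_point(soft_threshold, np.array([3.0, -0.2, 1.5]), lam=0.5, n_iter=10))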
Journal title:
Volume / Issue
Pages -
Publication date 2010